Implementing recurrent back-propagation on the Connection Machine

Author

  • Etienne Deprit
Abstract

The recurrent back-propagation algorithm for neural networks has been implemented on the Connection Machine, a massively parallel processor. Two fundamentally different graph architectures underlying the nets were tested: one based on arcs, the other on nodes. Confirming the predominance of communication over computation, performance measurements underscore the necessity to make connections the basic unit of representation. Comparisons between these graph algorithms lead to important conclusions concerning the parallel implementation of neural nets in both software and hardware.

Keywords: Neural networks, Recurrent back-propagation, Continuous mapping, Associative memory, Parallel processing, Massively parallel processor, Parallel graph algorithms.
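The recurrent (fixed-point) back-propagation rule the abstract refers to can be sketched in a few lines, in the style of Pineda's formulation: relax the activations to a fixed point x* = f(Wx* + I), relax an adjoint error system to its own fixed point, then update each weight from the outer product of the two. The network size, learning rate, and targets below are illustrative assumptions, and this is a serial sketch, not a Connection Machine implementation:

```python
import numpy as np

# Hypothetical toy setup; sizes, gain, and targets are illustrative assumptions.
rng = np.random.default_rng(0)
n = 8                                    # number of units
W = 0.1 * rng.standard_normal((n, n))    # recurrent weight matrix
I = rng.standard_normal(n)               # fixed external input
target = np.zeros(n)
target[:2] = 1.0                         # desired fixed-point activations

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def relax(step, v, iters=200):
    """Iterate a mapping to an (approximate) fixed point."""
    for _ in range(iters):
        v = step(v)
    return v

losses = []
for _ in range(500):
    # Forward relaxation: x* = f(W x* + I)
    x = relax(lambda x: sigmoid(W @ x + I), np.zeros(n))
    fp = x * (1.0 - x)                   # f'(u) at the fixed point
    e = target - x                       # error on the units
    # Adjoint relaxation: y = f'(u) * (W^T y + e)
    y = relax(lambda y: fp * (W.T @ y + e), np.zeros(n))
    W += 0.2 * np.outer(y, x)            # weight update: delta_w_ij = eta * y_i * x_j
    losses.append(float(e @ e))

print(losses[0], "->", losses[-1])       # squared error before and after training
```

On a massively parallel machine the interesting question, as the abstract notes, is whether the outer-product update and the two relaxations are distributed over nodes (one processor per unit) or over arcs (one processor per connection); the serial NumPy matrix products above hide exactly that choice.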


Related articles

Online Voltage Stability Monitoring and Prediction by Using Support Vector Machine Considering Overcurrent Protection for Transmission Lines

In this paper, a novel method is proposed to monitor power system voltage stability using a Support Vector Machine (SVM) on real-time data received from the Wide Area Measurement System (WAMS). In this study, the effects of protection schemes on the voltage magnitude of the buses are considered, whereas they have not been investigated in previous research. Considering overcurr...


Simple Learning Algorithm for Recurrent Networks to Realize Short-Term Memories

A simple supervised learning algorithm for recurrent neural networks is proposed. It needs only O(n) memory and O(n) computation, where n is the number of neurons, by limiting the problems to the delayed-recognition (short-term memory) problem. Since O(n) is the same as the order of the number of connections in the neural network, it is reasonable for implementation. This learning algorithm is si...


Causal Back Propagation through Time for Locally Recurrent Neural Networks

This paper concerns dynamic neural networks for signal processing: architectural issues are considered but the paper focuses on learning algorithms that work on-line. Locally recurrent neural networks, namely MLP with IIR synapses and generalization of Local Feedback MultiLayered Networks (LF MLN), are compared to more traditional neural networks, i.e. static MLP with input and/or output buffer...



A Unifying View of Gradient Calculations and Learning for Locally Recurrent Neural Networks

In this paper a critical review of gradient-based training methods for recurrent neural networks is presented, including Back Propagation Through Time (BPTT), Real Time Recurrent Learning (RTRL), and several specific learning algorithms for different locally recurrent architectures. From this survey emerges the need for a unifying view of all the specific procedures proposed for networks wit...
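The BPTT gradient surveyed above can be sketched for the simplest possible case: a single scalar linear recurrent unit h_t = w·h_{t-1} + v·x_t, unrolled over a sequence. The function name and the toy sequence below are illustrative assumptions; the backward loop accumulates the error through time exactly as the unrolled chain rule prescribes:

```python
def bptt_gradients(w, v, xs, ts):
    """Exact BPTT gradients of E = 0.5 * sum_t (h_t - ts_t)^2 for the
    scalar linear recurrent unit h_t = w * h_{t-1} + v * x_t, h_0 = 0."""
    hs = [0.0]
    for x in xs:                          # forward pass: unroll in time
        hs.append(w * hs[-1] + v * x)
    gw = gv = delta = 0.0
    for t in range(len(xs), 0, -1):       # backward pass through time
        delta = w * delta + (hs[t] - ts[t - 1])  # error flowing back one step
        gw += delta * hs[t - 1]           # dE/dw accumulates over time steps
        gv += delta * xs[t - 1]           # dE/dv likewise
    return gw, gv

# Toy sequence (illustrative values only)
gw, gv = bptt_gradients(0.5, 0.3, [1.0, -1.0, 2.0], [0.1, 0.2, 0.3])
```

RTRL computes the same gradient, but carries sensitivities forward in time instead of sweeping backward, which is the trade-off the survey above sets out to unify.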



Journal:
  • Neural Networks

Volume 2, Issue -

Pages -

Publication year: 1989